THE OFFICE OF COMPLIANCE & PRIVACY SERVICES
 
 
Inside UVM – July 11, 2023
 
Privacy Matters Newsletter
 
NON-PUBLIC PROTECTED DATA (NPPD) AND ARTIFICIAL INTELLIGENCE (AI)
 

OpenAI, ChatGPT – What Does It Mean?
 
A lot has been written recently about chatbots, artificial intelligence (AI), and related tools, the most common of which is currently ChatGPT, operated by OpenAI. If you’ve been following the news, you’ve likely read about some of the privacy concerns raised by advancements in, and broad use of, AI tools.

You should already be familiar with UVM’s Privacy and Information Security Policies. Both policies require that, in accordance with applicable federal, state, and international laws, we limit the access, use, and disclosure of Non-Public Protected Data (NPPD). Plus, protecting someone’s privacy is just the right thing to do.

What is NPPD?
 
NPPD is information that is not intended to be public. It is a UVM term that covers personal, protected, confidential, and proprietary information. A full list of what constitutes NPPD can be found in the definitions section of the Privacy Policy.

What do UVM’s policies have to do with artificial intelligence (AI) tools?
 
If NPPD is disclosed without authorization, any number of laws could be violated. AI tools are not authorized for use by UVM, and UVM does not have a contractual relationship with these AI companies. These tools are neither private nor secure. In fact, for illustration only, OpenAI’s Privacy Policy tells ChatGPT users the many ways it may use their content.

Bottom line: entering NPPD into an AI tool constitutes an unauthorized disclosure. If the information you’ve entered into an AI tool includes any NPPD, you may have violated someone’s privacy rights, created a reportable breach, and violated federal, state, and/or international law.

What are some examples?
 
  • Let’s say that you attended a meeting to discuss financial aid. You cut and paste your notes into an AI tool and ask it to create a summary of your notes. Your notes include some individually identifiable student information. This is an unauthorized disclosure that may violate the Family Educational Rights and Privacy Act (FERPA) and/or the Gramm-Leach-Bliley Act (GLBA).

  • You take a data set that was created during your research. The data includes individually identifiable health information. You upload the data and ask the tool to develop PowerPoint slides. This is an unauthorized disclosure that may violate the Health Insurance Portability and Accountability Act (HIPAA). It also may need to be reported to the individuals, to government regulators, and/or to law enforcement.

  • You just completed a search to fill an open position in your unit. You copy demographic information that includes some personally identifiable data elements. You paste it into the AI tool and ask it to generate an offer letter*. This is an unauthorized disclosure that may violate both state and federal laws. Under Vermont state law, it might also need to be reported to the individuals, to government agencies, and/or to law enforcement.

If you have any questions about this Privacy Matters article, contact the Chief Privacy Officer. If you have questions about information security, contact the Information Security Officer.



*This is used as an illustration only. Units should not be sending offer letters for staff on their own. Staff offer letters are now sent by Human Resource Services.